14 research outputs found

    Robot Egomotion from the Deformation of Active Contours

    Get PDF
    Traditional sources of information for image-based computer vision algorithms have been points, lines, corners, and, more recently, SIFT features (Lowe, 2004), which currently represent the state of the art in feature definition. Alternatively, the present work explores the possibility of using tracked contours as informative features, especially in applications …

    HMI with Vision System to Control Manipulator by Operator Hand Movement

    No full text

    Robotic Leaf Probing Via Segmentation of Range Data Into Surface Patches

    Get PDF
    Abstract — We present a novel method for the robotized probing of plant leaves using Time-of-Flight (ToF) sensors. Plant images are segmented into surface patches by combining a segmentation of the infrared intensity image, provided by the ToF camera, with quadratic surface fitting using ToF depth data. Leaf models are fitted to the boundaries of the segments and used to determine probing points and to evaluate the suitability of leaves for being sampled. The robustness of the approach is evaluated by repeatedly placing a specially adapted, robot-mounted SPAD meter on the probing points, which are extracted automatically. The number of successful chlorophyll measurements is counted, and the total time for processing the visual data and probing the plant with the robot is measured for each trial. In case of failure, the underlying causes are determined and reported, allowing a better assessment of the applicability of the method in real scenarios.
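    The quadratic surface fitting mentioned in the abstract can be illustrated with a minimal sketch: fitting z = f(x, y) to the ToF depth points of a candidate patch by linear least squares. This is a generic illustration, not the paper's implementation; the function names, the synthetic data, and the use of the RMS residual as a patch-quality score are assumptions.

    ```python
    # Minimal sketch (assumed, not the paper's code): least-squares fit of a
    # quadratic surface z = a*x^2 + b*y^2 + c*x*y + d*x + e*y + f to a patch
    # of 3D points coming from ToF depth data.
    import numpy as np

    def fit_quadratic_surface(points):
        """Fit a quadratic surface to an Nx3 array of (x, y, z) points."""
        x, y, z = points[:, 0], points[:, 1], points[:, 2]
        # Design matrix: one row per point, one column per coefficient.
        A = np.column_stack([x**2, y**2, x * y, x, y, np.ones_like(x)])
        coeffs, _, _, _ = np.linalg.lstsq(A, z, rcond=None)
        # RMS residual indicates how well a single surface explains the patch.
        rms = np.sqrt(np.mean((A @ coeffs - z) ** 2))
        return coeffs, rms

    # Usage on synthetic points from a gently curved, leaf-like surface.
    rng = np.random.default_rng(0)
    xy = rng.uniform(-0.05, 0.05, size=(200, 2))                  # metres
    z = 0.4 - 2.0 * xy[:, 0]**2 - 1.5 * xy[:, 1]**2 + rng.normal(0, 1e-3, 200)
    coeffs, rms = fit_quadratic_surface(np.column_stack([xy, z]))
    print(coeffs, rms)
    ```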

    Learned Vertex Descent: A New Direction for 3D Human Model Fitting

    No full text

    SMPLicit: Topology-aware Generative Model for Clothed People

    No full text

    Affine epipolar direction from two views of a planar contour

    Get PDF
    Abstract. Most approaches to camera motion estimation from image sequences require matching the projections of at least 4 non-coplanar points in the scene. The case of points lying on a plane has only recently been addressed, using mainly projective cameras. We here study what can be recovered from two uncalibrated views of a planar contour under affine viewing conditions. We prove that the affine epipolar direction can be recovered provided camera motion is free of cyclorotation. The proposed method consists of two steps: 1) computing the affinity between two views by tracking a planar contour, and 2) recovering the epipolar direction by solving a second-order equation on the affinity parameters. Two sets of experiments were performed to evaluate the accuracy of the method. First, synthetic image streams were used to assess the sensitivity of the method to controlled changes in viewing conditions and to image noise. Then, the method was tested under more realistic conditions by using a robot arm to obtain calibrated image streams, which permit comparing our results to ground truth.
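    Step 1 of the described method, computing the affinity between the two views, can be sketched as a plain least-squares problem given matched samples along the tracked contour. The sketch below is a generic illustration under that assumption; it is not the paper's tracker, and it does not reproduce the second-order equation of step 2 that yields the epipolar direction.

    ```python
    # Minimal sketch (assumed): estimate the affinity x' = A x + t between two
    # views from matched contour samples by linear least squares.
    import numpy as np

    def estimate_affinity(pts1, pts2):
        """Least-squares affinity mapping Nx2 pts1 onto Nx2 pts2."""
        n = len(pts1)
        # Homogeneous design matrix [x, y, 1], solved per output coordinate.
        M = np.column_stack([pts1, np.ones(n)])
        params, _, _, _ = np.linalg.lstsq(M, pts2, rcond=None)  # shape (3, 2)
        A = params[:2].T        # 2x2 linear part of the affinity
        t = params[2]           # 2-vector translation
        return A, t

    # Usage: a synthetic planar contour and its image under a known affinity.
    theta = np.linspace(0, 2 * np.pi, 100)
    c1 = np.column_stack([np.cos(theta), 0.6 * np.sin(theta)])
    A_true = np.array([[1.1, 0.2], [-0.1, 0.9]])
    c2 = c1 @ A_true.T + np.array([0.3, -0.2])
    A_est, t_est = estimate_affinity(c1, c2)
    print(np.allclose(A_est, A_true), t_est)
    ```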